9 research outputs found

    Statistical emulation of landslide-induced tsunamis at the Rockall Bank, NE Atlantic

    Statistical methods offer a useful approach to understanding and quantifying the uncertainty that governs complex tsunami mechanisms. Numerical experiments often carry a high computational cost, which limits uncertainty and sensitivity analyses that require numerous simulations. Statistical emulators, as surrogates of these simulators, predict the physical process at a fraction of the computational expense, making it feasible to explore thousands of scenarios that would otherwise be numerically prohibitive. In this work, we build a statistical emulator of the deterministic codes used to simulate submarine sliding and tsunami generation at the Rockall Bank, NE Atlantic Ocean, in two stages. First, we calibrate, against observations of the landslide deposits, the parameters used in the landslide simulations. This calibration is performed in a Bayesian framework using Gaussian process (GP) emulators to approximate the landslide model and the discrepancy function between model and observations, and it yields distributions of the calibrated input parameters. In a second step, a GP emulator is built to mimic the coupled landslide–tsunami numerical process. This emulator propagates the uncertainties in the calibrated input-parameter distributions inferred in the first step through to the outputs. As a result, a quantification of the uncertainty in the maximum free-surface elevation at specified locations is obtained.
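The workflow above rests on Gaussian process regression fitted to a modest number of simulator runs. As a minimal sketch of that machinery (not the authors' implementation — the squared-exponential kernel, fixed length-scale, and toy one-dimensional "simulator" below are illustrative assumptions), a GP emulator can be trained on input–output pairs and queried for a posterior mean and variance:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.3, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = A[:, None] - B[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_predict(X_train, y_train, X_new, noise=1e-6):
    """Posterior mean and variance of a zero-mean GP at new inputs."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_new, X_train)
    L = np.linalg.cholesky(K)                     # stable solve via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = rbf_kernel(X_new, X_new).diagonal() - np.sum(v ** 2, axis=0)
    return mean, var

# Toy stand-in for an expensive simulator, run at 8 design points
X = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * X)
mean, var = gp_predict(X, y, np.array([0.5]))
```

In practice the inputs are multi-dimensional landslide parameters, the kernel hyperparameters are estimated from the training runs, and the same posterior variance feeds the Bayesian calibration's uncertainty estimates.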

    Probabilistic, high-resolution tsunami predictions in northern Cascadia by exploiting sequential design for efficient emulation

    The potential of a full-margin rupture along the Cascadia subduction zone poses a significant threat over a populous region of North America. Previous probabilistic tsunami hazard assessment studies produced hazard curves based on simulated predictions of tsunami waves, either at low resolution, at high resolution only for a local area or a limited range of scenarios, or at the high computational cost of generating hundreds of high-resolution scenarios. We use the graphics processing unit (GPU)-accelerated tsunami simulator VOLNA-OP2 with a detailed representation of topographic and bathymetric features. To overcome the large computational burden, we replace the simulator with a Gaussian process emulator at each output location. The emulators are statistical approximations of the simulator's behaviour. We train the emulators on a set of input–output pairs and use them to generate approximate output values over a six-dimensional scenario parameter space, e.g. uplift/subsidence ratio and maximum uplift, that represents the seabed deformation. We implement an advanced sequential design algorithm for the optimal selection of only 60 simulations. The low cost of emulation provides additional flexibility in the shape of the deformation, which we illustrate here by considering two families – buried rupture and splay-faulting – of 2000 potential scenarios. This approach allows for the first emulation-accelerated computation of probabilistic tsunami hazard in the region of the city of Victoria, British Columbia.
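The sequential design the authors implement is more sophisticated, but its core idea — spending each new expensive simulation where it is most informative — can be illustrated with a simple maximin (space-filling) criterion. The one-dimensional candidate grid and two-point starting design below are illustrative assumptions, not the paper's six-dimensional setup:

```python
import numpy as np

def next_design_point(design, candidates):
    """Maximin criterion: pick the candidate farthest from the current design."""
    d = np.abs(candidates[:, None] - design[None, :]).min(axis=1)
    return candidates[np.argmax(d)]

design = np.array([0.0, 1.0])              # initial runs at the domain corners
candidates = np.linspace(0.0, 1.0, 101)    # cheap-to-evaluate candidate grid
for _ in range(3):                         # add three more simulation points
    design = np.append(design, next_design_point(design, candidates))
```

Variance-based criteria (choosing the candidate where the emulator's predictive variance is largest) follow the same loop, with the distance proxy replaced by a GP posterior variance.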

    Biophysical models of persistent connectivity and barriers on the northern Mid-Atlantic Ridge

    A precautionary approach to protecting biodiversity on mid-ocean ridges, while permitting seabed mining, is to design and implement a network of areas protected from the effects of mining. Such a network should capture representative populations of vent endemic fauna within regions of connectivity and across persistent barriers, but determining where such connectivity and barriers exist is challenging. A promising approach is to use biophysical modeling to infer the spatial scale of dispersal and the positions where breaks in hydrographic connectivity occur. We use results from a deep-sea biophysical model, driven by data from the global array of Argo probes at a depth of 1000 m, to estimate biophysical connectivity among fragmented hydrothermal vent habitats along the Mid-Atlantic Ridge, from the equator northward to the Portuguese Exclusive Economic Zone surrounding the Azores. The spatial scale of dispersal varies along the ridge axis, with median dispersal distances for planktonic larval durations (PLDs) of 75 d ranging from 67 km to 304 km. This scale of dispersal creates considerable opportunities for connectivity through mid-water dispersal. A stable pattern of five regions of biophysical connectivity was obtained for PLDs of 100 d or more. Connectivity barriers between these regions can persist even when planktonic larval duration extends beyond 200 d. For a 50 d PLD, one connectivity barrier coincides with the region of the genetic hybrid zone for northern and southern vent mussel species at the Broken Spur vent field. Additional barriers suggest potential for genetic differentiation that has so far not been detected for any taxon. The locations of persistent zones of connectivity and barriers to dispersal suggest that there may be multiple biogeographic subunits along the northern Mid-Atlantic Ridge that should be taken into account in planning for effective environmental management of human activities.
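Given particle-tracking output from such a biophysical model, the median dispersal distance statistic quoted above is straightforward to compute. The release and settlement positions below are synthetic stand-ins drawn from an assumed Gaussian dispersal kernel, not actual model output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical particle-tracking output: along-ridge release and settlement
# positions (km) for 10,000 simulated larvae at a fixed PLD.
release = np.zeros(10_000)
settle = release + rng.normal(loc=0.0, scale=150.0, size=release.size)

# Median of the absolute along-ridge displacement
median_dispersal = np.median(np.abs(settle - release))
```

Repeating this per release site and per PLD gives the along-axis variation in dispersal scale; thresholding the pairwise exchange of particles between sites yields the connectivity regions and barriers.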

    Probabilistic Landslide-Generated Tsunamis in the Indus Canyon, NW Indian Ocean, Using Statistical Emulation

    The Indus Canyon in the northwestern Indian Ocean has been reported to be the site of numerous submarine mass failures in the past. This study is the first to investigate potential tsunami hazards associated with such mass failures in this region. We employed statistical emulation, i.e. surrogate modelling, to efficiently quantify uncertainties associated with slump-generated tsunamis at the slopes of the canyon. We simulated 60 slump scenarios with thicknesses of 100–300 m, widths of 6–10.5 km, travel distances of 500–2000 m and submergence depths of 250–450 m. These scenarios were then used to train the emulator and predict 500,000 trial scenarios, enabling a probabilistic study of the tsunami hazard in the near field. Due to the narrow, deep canyon walls and the shallow continental shelf in the adjacent regions (<100 m water depth), the tsunami propagation has a distinctive pattern, resembling an ellipse stretched in the NE–SW direction. The results show that the most likely tsunami amplitudes and velocities are approximately 0.2–1.0 m and 2.5–13 m/s, respectively, which can potentially impact vessels and maritime facilities. We demonstrate that the emulator-based approach is an important tool for probabilistic hazard analysis, since it can generate thousands of tsunami scenarios in a few seconds, compared to days of computation on High Performance Computing facilities for a single run of the dispersive tsunami solver used here.
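Once trained, an emulator is queried over a dense sample of the input space. A space-filling sample of the four slump parameters named above can be drawn with a basic Latin hypercube routine — a hand-rolled sketch under assumed uniform ranges; the study's actual sampling scheme is not specified here:

```python
import numpy as np

def latin_hypercube(n, bounds, rng):
    """One stratified draw per equal-probability bin in each dimension,
    with the bins randomly paired across dimensions."""
    d = len(bounds)
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        u[:, j] = u[rng.permutation(n), j]   # decorrelate the columns
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    return lo + u * (hi - lo)

rng = np.random.default_rng(42)
# Parameter ranges from the study: thickness (m), width (km),
# travel distance (m), submergence depth (m)
bounds = [(100, 300), (6, 10.5), (500, 2000), (250, 450)]
scenarios = latin_hypercube(500_000, bounds, rng)
```

Feeding each row to the trained emulator yields the distribution of predicted amplitudes and velocities from which the "most likely" ranges above are read off.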

    Rheological considerations for the modelling of submarine sliding at Rockall Bank, NE Atlantic Ocean

    Recent scientific research indicates that the Rockall Bank Slide Complex in the NE Atlantic Ocean formed as the result of repetitive slope failures that can be distinguished into at least three major phases. These sliding episodes took place during and before the Last Glacial Maximum. This work models each sliding episode by incorporating the landslide's rheological properties. The objective is to study the landslide kinematics and final deposition of each episode within a rheological framework consistent with the field observations. To do so, we use different types of rheological models to compute the total retarding stress and simulate submarine failure. The Bingham rheology and the frictional rheology are used to model the flow behavior, the aim being to understand the effect of these two classical laws on landslide kinematics. A rheological model that combines the two regimes is also used, and the Voellmy model is employed to account for the hydrodynamic drag. The results are validated against the field observations on the seabed of the Rockall Trough. The simulations show that, for this particular case, the Bingham rheology with a small or negligible basal friction produces the best results. The tsunamigenic potential of the episodes is also briefly examined.
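The retarding-stress laws named above have standard closed forms, sketched below under a simple-shear Bingham approximation; the slide density and all parameter values are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np

RHO = 1800.0   # slide bulk density (kg/m^3), assumed
G = 9.81       # gravitational acceleration (m/s^2)

def bingham_stress(tau_y, mu, u, h):
    """Bingham fluid: yield stress plus a viscous term for a slide of
    thickness h moving at speed u (simple-shear approximation)."""
    return tau_y + mu * u / h

def frictional_stress(mu_f, sigma_n):
    """Coulomb friction: retarding stress proportional to the
    effective normal stress at the base."""
    return mu_f * sigma_n

def voellmy_stress(mu_f, sigma_n, u, xi):
    """Voellmy: Coulomb friction plus a turbulent, velocity-squared
    drag term controlled by the turbulence coefficient xi."""
    return mu_f * sigma_n + RHO * G * u ** 2 / xi
```

A combined Bingham–frictional law simply sums (or switches between) the first two terms; the finding that a small basal friction fits best corresponds to the frictional contribution being negligible relative to the yield stress.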

    Impact of future tsunamis from the Java trench on household welfare: Merging geophysics and economics through catastrophe modelling

    This paper presents the first end-to-end example of a risk model for loss of household assets due to possible future tsunamis. There is a significant need for governments to assess both the generic risk to buildings and the concrete impact on the full range of household assets, including those key to livelihoods such as agricultural land, fishing boats, livestock and equipment. Our approach relies on the Oasis Loss Modelling Framework to integrate hazard and risk. We first generate 25 representative events of tsunamigenic earthquakes off the southern coast of Java, Indonesia. We then create a new vulnerability function based upon the Indonesian household survey STAR1, which records how much household assets were reduced after the 2004 tsunami. We run a multinomial logit regression to allocate the probabilistic impacts to bins that correspond to levels of financial reduction in assets. We focus on the town of Cilacap, for which we build loss exceedance curves, representing the financial losses that may be exceeded over a range of future timelines, using future tsunami inundations over a surveyed layout and valuation of assets across the city. Our loss calculations show that losses increase sharply, especially for events with return periods beyond 250 years. These computations will allow more accurate investigations of impacts on livelihoods and thus help design mitigation strategies, as well as policies to minimize suffering from tsunamis.
    Funding: Lloyd's Tercentenary Research Foundation; Lighthill Risk Network; Alan Turing Institute project "Uncertainty Quantification of multi-scale and multiphysics computer models: applications to hazard and climate models" (EPSRC EP/N510129/1); Royal Society, United Kingdom (CHL/R1/180173).
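A loss exceedance curve like the one described above follows mechanically from an event set with per-event losses and annual occurrence rates: sort events by loss and accumulate their rates. A minimal sketch with invented numbers (not the Cilacap results), assuming Poisson event arrivals:

```python
import numpy as np

def exceedance_curve(losses, annual_rates):
    """Annual rate of exceeding each loss level: sort events by loss
    (descending) and accumulate their annual occurrence rates."""
    order = np.argsort(losses)[::-1]
    losses = np.asarray(losses, dtype=float)[order]
    rates = np.cumsum(np.asarray(annual_rates, dtype=float)[order])
    return losses, rates

# Hypothetical event set: loss per event (USD) and annual occurrence rate
losses = [1e6, 5e6, 2e7, 8e7]
rates = [1 / 50, 1 / 200, 1 / 500, 1 / 2500]
loss_levels, exceed_rates = exceedance_curve(losses, rates)

# Return period of exceeding each loss level
return_periods = 1.0 / exceed_rates
```

Reading the curve at a given return period (e.g. 250 years) gives the loss level expected to be exceeded at that frequency, which is where the sharp increase noted above appears.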

    Performance analysis of Volna-OP2 – massively parallel code for tsunami modelling

    The software package Volna-OP2 is a robust and efficient code capable of simulating the complete life cycle of a tsunami whilst harnessing the latest High Performance Computing (HPC) architectures. In this paper, a comprehensive error analysis and scalability study of the GPU version of the code is presented. A novel decomposition of the numerical errors into dispersion and dissipation components is explored. Most tsunami codes exhibit amplitude smearing and/or phase lagging/leading, and the decomposition shown here is a novel tool for explaining these effects. To date, Volna-OP2 has been widely used by the tsunami modelling community; in particular, its computational efficiency has enabled various sensitivity analyses and uncertainty quantification studies. Given the number of simulations required, there is always a trade-off between accuracy and runtime when carrying out these statistical studies. The analysis presented in this paper will guide the user towards an acceptable level of accuracy within a given runtime.
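The dispersion/dissipation split can be illustrated on a single Fourier mode: comparing a numerical wave against the exact one, the amplitude ratio measures dissipation and the phase offset measures dispersion. The damped, phase-lagged "numerical" signal below is synthetic, illustrating only the idea rather than Volna-OP2's actual analysis:

```python
import numpy as np

x = np.linspace(0.0, 2 * np.pi, 256, endpoint=False)
exact = np.cos(x)
# Synthetic "numerical" solution: 5% amplitude loss, 0.1 rad phase lag
numerical = 0.95 * np.cos(x - 0.1)

k = 1  # wavenumber of the mode of interest
F_exact = np.fft.rfft(exact)[k]
F_num = np.fft.rfft(numerical)[k]

dissipation = 1.0 - np.abs(F_num) / np.abs(F_exact)   # fractional amplitude loss
dispersion = np.angle(F_num) - np.angle(F_exact)      # phase error (radians)
```

Applied mode by mode to a real simulation, this separates amplitude smearing (dissipation) from phase lagging/leading (dispersion) in the numerical error.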